AIbase
# TPU efficient training

## Llama 3 Open Ko 8B Instruct Preview
Category: Other
A Korean language model built by continued pre-training of Llama-3-8B on 60GB+ of deduplicated, publicly available text; supports Korean and English.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Author: beomi · 6,014 · 60
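The entry above describes an instruct-tuned Llama-3 derivative. As a hedged illustration only, a chat prompt for such a model might be assembled by hand in Meta's published Llama 3 chat format; this template is an assumption based on the base model family, not stated in this listing, and in practice a tokenizer's `apply_chat_template` method from Hugging Face transformers would be used instead:

```python
# Hedged sketch: builds a prompt string in the Llama 3 chat format
# (assumed from the base model family, not confirmed by this listing).

def build_llama3_prompt(messages):
    """messages: list of {"role": ..., "content": ...} dicts."""
    parts = ["<|begin_of_text|>"]
    for m in messages:
        # Each turn: header tokens, blank line, content, end-of-turn token.
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Cue the model to generate the assistant's reply next.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt([
    {"role": "user", "content": "안녕하세요! 자기소개를 해주세요."},
])
```

The trailing assistant header leaves the prompt open so the model completes the assistant turn rather than echoing a new user turn.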
## Roberta Swedish
A Swedish pre-trained model based on the RoBERTa architecture, suitable for various natural language processing tasks.
Tags: Large Language Model
Author: birgermoell · 54 · 0
© 2025 AIbase